Discrimination laws must change to cover the impact of AI bias
Discrimination laws must be adapted to address the impact artificial intelligence algorithms have on certain groups, new research has found. The paper from the Oxford Internet Institute says that AI systems are exhibiting bias against groups not protected under current legislation, and that governments should consider updating laws to reflect this.

In the study, published today in the journal 'Tulane Law Review', author Professor Sandra Wachter of the Oxford Internet Institute argues that something as simple as the web browser you use, how fast you type or whether you sweat during an interview can lead to an AI system making a negative decision about you.

She says current discrimination laws do not adequately combat the type of bias exhibited by artificial intelligence: specific categories of people receive unfair outcomes, including over loan decisions, job applications and funding requests, yet fall outside the "protected groups" covered by discrimination legislation. Discrimination linked to AI can occur in even ordinary situations, without the individual ever knowing an AI made the final call, says Professor Wachter in her paper.